New Report: Do These 6 AI Voice Cloning Companies Do Enough to Prevent Misuse?

Today, Consumer Reports (CR) released a new assessment of AI voice cloning products, uncovering gaps in safeguards designed to prevent fraud and misuse. Voice cloning technology can streamline tasks like audio editing and narration, but it also presents a serious risk—scammers have already used AI-generated voices to impersonate family members in distress, as well as celebrities and politicians in deceptive schemes.

The CR report examines popular voice cloning products offered by six companies—Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify—and finds that four fail to take basic steps to stop unauthorized voice cloning. 

To combat these risks, CR is calling on companies to strengthen safeguards, such as adding a technical mechanism to confirm a speaker’s consent to having their voice cloned. CR is also urging stronger enforcement of consumer protection laws, noting that the report finds products with minimal safeguards may already run afoul of them. Additional policy recommendations on AI can be found here.
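One commonly discussed form of such a consent check, offered here purely as an illustration and not as any company's actual implementation, is to require the speaker to read a one-time, randomly generated phrase aloud and verify that the uploaded recording matches it before cloning is allowed. The sketch below assumes a hypothetical `transcribe()` speech-to-text backend; everything else uses the Python standard library.

```python
import secrets
import difflib

# Illustrative word list for building one-time consent phrases (not a real vendor API).
WORDS = ["orange", "river", "candle", "mirror", "planet", "quiet", "harbor", "velvet"]


def generate_consent_phrase(num_words: int = 6) -> str:
    """Build a one-time phrase the speaker must read aloud to prove consent."""
    return " ".join(secrets.choice(WORDS) for _ in range(num_words))


def transcribe(audio_path: str) -> str:
    """Placeholder: in practice this would call a real speech-to-text system."""
    raise NotImplementedError("plug in a speech-to-text backend here")


def consent_confirmed(audio_path: str, expected_phrase: str,
                      threshold: float = 0.85) -> bool:
    """Return True only if the recording closely matches the one-time phrase."""
    transcript = transcribe(audio_path).lower().strip()
    similarity = difflib.SequenceMatcher(
        None, transcript, expected_phrase.lower()
    ).ratio()
    return similarity >= threshold


# Usage sketch: only proceed to cloning if the consent gate passes.
# phrase = generate_consent_phrase()
# (prompt the user to record themselves reading `phrase`)
# if consent_confirmed("consent_recording.wav", phrase):
#     start_voice_cloning("consent_recording.wav")  # hypothetical downstream step
```

Because the phrase is generated fresh for each request, a scammer cannot reuse a pre-existing recording of the target's voice to pass the check; this is one way a provider could make the consent step meaningfully harder to bypass.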

For more details, read the full report here.
